Mutual Information Functions Versus Correlation Functions
Author
Abstract
This paper studies one application of mutual information to symbolic sequences: the mutual information function M(d). This function is compared with the more frequently used correlation function Γ(d). An exact relation between M(d) and Γ(d) is derived for binary sequences. For sequences with more than two symbols, no such general relation exists; in particular, Γ(d) = 0 may or may not lead to M(d) = 0. This linear, but not general, independence between symbols separated by a distance d is studied for ternary sequences. Also included in this paper is the estimation of the finite-size effect on calculating mutual information. Finally, the concept of "symbolic noise" is discussed.
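As a rough illustration of the two quantities being compared (not taken from the paper, whose exact definitions and normalizations may differ), the following Python sketch estimates M(d) and Γ(d) from a symbolic sequence by counting symbol pairs at separation d; the function names and the example sequence are hypothetical.

```python
# Illustrative sketch (not from the paper): plug-in estimates of the mutual
# information function M(d) and a correlation function Gamma(d) for a
# symbolic sequence, obtained by counting symbol pairs at separation d.
import math
from collections import Counter

def mutual_information_function(seq, d):
    """Plug-in estimate of M(d) = sum_ab P_ab(d) log( P_ab(d) / (P_a P_b) ), in nats."""
    pairs = list(zip(seq, seq[d:]))                      # pairs (s_i, s_{i+d})
    p_ab = {k: v / len(pairs) for k, v in Counter(pairs).items()}
    p_a = {k: v / len(seq) for k, v in Counter(seq).items()}
    return sum(p * math.log(p / (p_a[a] * p_a[b]))
               for (a, b), p in p_ab.items())

def correlation_function(seq, d, values=None):
    """Plug-in estimate of Gamma(d) = <x_i x_{i+d}> - <x_i>^2, with symbols mapped
    to numbers via `values` (default assumes numeric symbols such as '0', '1')."""
    values = values or {s: float(s) for s in set(seq)}
    x = [values[s] for s in seq]
    mean = sum(x) / len(x)
    n = len(x) - d
    return sum(x[i] * x[i + d] for i in range(n)) / n - mean * mean

# Example on a short binary sequence (a Thue-Morse-like pattern, repeated)
s = "0110100110010110" * 8
print(mutual_information_function(s, 3), correlation_function(s, 3))
```

For a two-symbol alphabet, as the abstract notes, M(d) and Γ(d) are related exactly; the sketch simply makes the two estimators concrete.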
Similar resources
Spatiotemporal dynamics of the magnetosphere during geospace storms: Mutual information analysis
The magnetospheric response to strong driving by the solar wind has spatial variations, and corresponding data are essential for the understanding of the spatiotemporal dynamics. A database of ISEE3 and IMP8 spacecraft and ground-based magnetometer data from high-latitude stations (Kamide et al., 1998) is used to study the magnetospheric response to solar wind variables during geospace stor...
Measuring Correlations in Protein Sequences
The paper is devoted to the detection of pair correlations in amino acid sequences using mutual information and correlation functions. Since statistical dependences are relatively weak, finite-sample corrections turn out to be essential for a correct interpretation of the correlation measures. Statistical fluctuations are reduced by analyzing two large sets of protein sequences. The calculation of ...
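The finite-sample corrections mentioned above are not spelled out in this snippet. Purely as a hedged illustration of the idea, the leading-order bias correction for a plug-in mutual information estimate (derived from the Miller-Madow entropy bias) can be sketched as follows; the function and argument names are placeholders.

```python
# Hedged sketch (not from the paper): first-order finite-sample bias correction
# for a plug-in mutual information estimate, following the Miller-Madow idea.
def corrected_mi(mi_plugin, k_xy, k_x, k_y, n_samples):
    """Subtract the leading O(1/N) upward bias of a plug-in MI estimate (in nats).

    k_xy, k_x, k_y: numbers of observed symbol pairs / individual symbols,
    n_samples: number of pairs used for the estimate.
    """
    bias = (k_xy - k_x - k_y + 1) / (2.0 * n_samples)
    return mi_plugin - bias
```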
Control of mutual spatial coherence of temporal features by reflexive photorefractive coupling
We analyze reflexive photorefractive coupling of an information-bearing beam carrying several distinct spatiotemporal features, with emphasis on the coupling-induced change in the mutual spatial coherence. We formulate equations describing the evolution of the mutual correlation functions between different features and discuss their solutions both in the full two-dimensional case and in the limit l...
Efficient Entropy Estimation for Mutual Information Analysis Using B-Splines
The Correlation Power Analysis (CPA) is probably the most widely used side-channel attack because it seems to fit the power model of most standard CMOS devices and can be computed very efficiently. However, the Pearson correlation coefficient used in CPA measures only linear statistical dependences, whereas the Mutual Information (MI) takes into account both linear and nonlinear dependences. Even if ther...
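As a quick, hedged illustration of the linear-versus-nonlinear point (not taken from the paper), a purely quadratic dependence gives a Pearson coefficient near zero while a coarse histogram-based mutual information estimate remains clearly positive; the bin count and noise level below are arbitrary choices.

```python
# Illustrative sketch: Pearson correlation misses a purely nonlinear
# dependence, while a simple histogram-based MI estimate does not.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
y = x**2 + 0.1 * rng.normal(size=100_000)   # nonlinear, symmetric dependence

pearson = np.corrcoef(x, y)[0, 1]           # close to 0

# Plug-in MI from a 2-D histogram (coarse, positively biased estimate, in nats)
counts, _, _ = np.histogram2d(x, y, bins=30)
p_xy = counts / counts.sum()
p_x = p_xy.sum(axis=1, keepdims=True)
p_y = p_xy.sum(axis=0, keepdims=True)
mask = p_xy > 0
mi = np.sum(p_xy[mask] * np.log(p_xy[mask] / (p_x @ p_y)[mask]))

print(f"Pearson = {pearson:.3f}, plug-in MI = {mi:.3f} nats")  # ~0 vs clearly > 0
```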
Boolean functions: noise stability, non-interactive correlation, and mutual information
Let ε ∈ [0, 1/2] be the noise parameter and p > 1. We study the isoperimetric problem of which Boolean function f : {0,1}^n → {0,1}, for fixed mean Ef, maximizes the p-th moment E(T_ε f)^p of the noise operator T_ε acting on Boolean functions. Our findings are: in the low-noise scenario, i.e., when ε is small, the maximum is achieved by the lexicographical function; in the high-noise scenario, i.e...
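The noise operator T_ε is only named in this snippet; for context, one common convention (parameterizations vary between papers) defines it as the average of f over an ε-noisy copy of the input:

```latex
% One common convention (assumed here, not taken from the snippet): T_eps
% averages f over an eps-noisy copy of x, where each bit of x is flipped
% independently with probability eps.
\[
  (T_\varepsilon f)(x) \;=\; \mathbb{E}\bigl[f(y)\bigr],
  \qquad \Pr[y_i \neq x_i] = \varepsilon \ \text{independently for each } i .
\]
```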
Journal:
Volume, Issue:
Pages: -
Publication date: 1990